Projected Newton-type Methods in Machine Learning
Abstract
We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while conceptually pleasing, has a high computational cost per iteration. Thus, we discuss two variants that are more scalable, namely, two-metric projection and inexact projection methods. Finally, we show how to apply the Newton-type framework to handle non-smooth objectives. Examples are provided throughout the chapter to illustrate machine learning applications of our framework.
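To make the framework concrete, here is a minimal sketch (ours, not the chapter's code) of the simplest member of this family on a box constraint: with the identity matrix as the Hessian approximation, the projected Newton-type step reduces to a projected gradient step with backtracking along the projection arc.

```python
import numpy as np

def projected_newton_type(f, grad, x0, lower, upper, max_iter=200, tol=1e-8):
    """Sketch: min f(x) subject to lower <= x <= upper.

    The scaling matrix here is the identity, so each iteration is a
    projected gradient step; the chapter's methods replace the negative
    gradient with a (quasi-)Newton direction.
    """
    x = np.clip(x0, lower, upper)                  # start from a feasible point
    for _ in range(max_iter):
        g = grad(x)
        t = 1.0
        while True:                                # backtrack along the projection arc
            x_new = np.clip(x - t * g, lower, upper)
            if f(x_new) <= f(x) + 1e-4 * g @ (x_new - x) or t < 1e-12:
                break
            t *= 0.5
        if np.linalg.norm(x_new - x) <= tol:       # fixed point => stationarity
            break
        x = x_new
    return x
```

Swapping the negative gradient for a (quasi-)Newton direction recovers the canonical method, at the price of a projection in the metric induced by the Hessian approximation; the two-metric and inexact-projection variants exist precisely to avoid that expense.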
Similar papers
A New Projected Quasi-Newton Approach for the Nonnegative Least Squares Problem
Constrained least squares estimation lies at the heart of many applications in fields as diverse as statistics, psychometrics, signal processing, or even machine learning. Nonnegativity requirements on the model variables are amongst the simplest constraints that arise naturally, and the corresponding least-squares problem is called Nonnegative Least Squares or NNLS. In this paper we present a ...
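For scale, small and medium NNLS instances can already be solved off the shelf; the snippet below (with purely illustrative random data) uses SciPy's classic Lawson-Hanson active-set solver, the kind of baseline a projected quasi-Newton approach targets on larger problems.

```python
import numpy as np
from scipy.optimize import nnls

# Solve min ||A x - b||_2 subject to x >= 0.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
b = rng.standard_normal(100)

x, residual_norm = nnls(A, b)   # Lawson-Hanson active-set method
assert np.all(x >= 0)           # the solution is elementwise nonnegative
print(residual_norm)
```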
A Fast Dual Projected Newton Method for l1-Regularized Least Squares
L1-regularized least squares, with its ability to discover sparse representations, is quite prevalent in the fields of machine learning, statistics, and signal processing. In this paper, we propose a novel algorithm called Dual Projected Newton Method (DPNM) to solve the l1-regularized least squares problem. In DPNM, we first derive a new dual problem as a box constrained quadratic programming....
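The abstract is truncated, but its key object is a box-constrained quadratic program. The sketch below is not the paper's DPNM algorithm; it is a generic projected-gradient solver for min 0.5 x^T Q x + c^T x subject to l <= x <= u, illustrating why box constraints are attractive: the projection is elementwise clipping.

```python
import numpy as np

def box_qp_projected_gradient(Q, c, lower, upper, max_iter=1000, tol=1e-8):
    """Projected gradient for min 0.5 x^T Q x + c^T x, lower <= x <= upper.

    Illustrative only: a fixed step 1/L with L = ||Q||_2 keeps the sketch
    short and convergent for positive semidefinite Q; a Newton-type method
    such as DPNM would use curvature information instead.
    """
    L = np.linalg.norm(Q, 2)                      # Lipschitz constant of the gradient
    x = np.clip(np.zeros_like(c), lower, upper)
    for _ in range(max_iter):
        g = Q @ x + c
        x_new = np.clip(x - g / L, lower, upper)  # projection = clipping onto the box
        if np.linalg.norm(x_new - x) <= tol:
            break
        x = x_new
    return x
```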
GNMF with Newton-Based Methods
Several variants of Nonnegative Matrix Factorization (NMF) have been proposed for supervised classification of various objects. Graph regularized NMF (GNMF) incorporates information on the geometric structure of the data into the training process, which considerably improves the classification results. However, the multiplicative algorithms used for updating the underlying factors may result in a sl...
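For context, the multiplicative updates in question are the classical Lee-Seung rules; a minimal NumPy sketch for the plain NMF objective ||V - W H||_F^2 follows (GNMF adds a graph-regularization term, omitted here). Their slow convergence is what motivates the Newton-based alternatives.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-10, seed=0):
    """Lee-Seung multiplicative updates for min ||V - W H||_F^2 with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # ratio updates preserve nonnegativity
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```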
Proximal Newton-type methods for convex optimization
We seek to solve convex optimization problems in composite form: $\min_{x \in \mathbb{R}^n} f(x) := g(x) + h(x)$, where $g$ is convex and continuously differentiable and $h : \mathbb{R}^n \to \mathbb{R}$ is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalization of Newton-type methods to handle such convex but nonsmooth objective functions. We prove such met...
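A special case may help fix ideas: replacing the Hessian of the quadratic model with (1/t) I turns the proximal Newton step into the proximal gradient step x+ = prox_{t h}(x - t grad g(x)), and for h = lambda * ||.||_1 the proximal mapping is soft-thresholding. A minimal sketch of this reduced case (ours, not the paper's method):

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal mapping of tau * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def proximal_gradient(grad_g, x0, lam, step, n_iter=500):
    """Proximal gradient for min g(x) + lam * ||x||_1.

    This is the proximal Newton framework with the crude Hessian model
    (1/step) * I; `step` should not exceed 1/L for L the Lipschitz
    constant of grad g.
    """
    x = x0.copy()
    for _ in range(n_iter):
        x = soft_threshold(x - step * grad_g(x), step * lam)
    return x
```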
Revisiting Sub-sampled Newton Methods
Many machine learning models depend on solving a large-scale optimization problem. Recently, sub-sampled Newton methods have attracted much attention because of their per-iteration efficiency: they rectify a weakness of the ordinary Newton method, which suffers a high cost at each iteration despite its high convergence rate. In this work we propose two new efficient ...
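A hedged sketch of the basic device (not the paper's specific variants): keep the full gradient but estimate the Hessian from a random subsample of the data, shown here for l2-regularized logistic regression with labels in {-1, +1}.

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def subsampled_newton_logreg(A, y, mu=1e-3, batch=256, n_iter=50, seed=0):
    """Sub-sampled Newton sketch for l2-regularized logistic regression."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iter):
        z = A @ x
        g = -(A.T @ (y * sigmoid(-y * z))) / n + mu * x         # full gradient
        idx = rng.choice(n, size=min(batch, n), replace=False)  # Hessian subsample
        zs = A[idx] @ x
        w = sigmoid(zs) * (1.0 - sigmoid(zs))                   # curvature weights
        H = (A[idx].T * w) @ A[idx] / len(idx) + mu * np.eye(d)
        x -= np.linalg.solve(H, g)                              # Newton step
    return x
```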